Incremental kernel PCA and the Nyström method

Authors

  • Fredrik Hallgren
  • Paul Northrop
Abstract

Incremental versions of batch algorithms are often desired, for increased time efficiency in the streaming data setting, or increased memory efficiency in general. In this paper we present a novel algorithm for incremental kernel PCA, based on rank one updates to the eigendecomposition of the kernel matrix, which is more computationally efficient than comparable existing algorithms. We extend our algorithm to incremental calculation of the Nyström approximation to the kernel matrix, the first such algorithm proposed. Incremental calculation of the Nyström approximation leads to further gains in memory efficiency, and allows for empirical evaluation of when a subset of sufficient size has been obtained.
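To make the core idea concrete, here is a minimal NumPy sketch of growing the eigendecomposition of a kernel matrix by one point via rank-one updates. This is not the authors' implementation: the bordered matrix [[K, k], [k^T, kappa]] is written as the zero-padded K plus two symmetric rank-one terms, and for clarity each update re-diagonalises the small core with eigh, an O(n^3) step; an efficient implementation like the paper's would instead solve the secular equation in O(n^2). The function names (rank_one_update, kpca_append) are mine.

    import numpy as np

    def rank_one_update(Q, lam, v, rho):
        """Eigendecomposition of Q @ diag(lam) @ Q.T + rho * outer(v, v).

        Illustrative version: re-diagonalises the diagonal-plus-rank-one
        core with eigh (O(n^3)); an efficient implementation would solve
        the secular equation for the new eigenvalues in O(n^2) instead.
        """
        z = Q.T @ v                                # v in the current eigenbasis
        lam_new, U = np.linalg.eigh(np.diag(lam) + rho * np.outer(z, z))
        return Q @ U, lam_new

    def kpca_append(Q, lam, k_col, k_self):
        """Grow the eigendecomposition of a kernel matrix by one point.

        [[K, k], [k.T, kappa]] equals the zero-padded K plus
        b @ e.T + e @ b.T, with b = [k; kappa/2] and e the last standard
        basis vector, which splits into two symmetric rank-one updates:
            0.5*(b+e)(b+e).T - 0.5*(b-e)(b-e).T
        """
        n = Q.shape[0]
        Qp = np.zeros((n + 1, n + 1))
        Qp[:n, :n] = Q
        Qp[n, n] = 1.0                             # new point enters with eigenvalue 0
        lamp = np.append(lam, 0.0)
        b = np.append(k_col, 0.5 * k_self)
        e = np.zeros(n + 1)
        e[n] = 1.0
        Qp, lamp = rank_one_update(Qp, lamp, b + e, 0.5)
        Qp, lamp = rank_one_update(Qp, lamp, b - e, -0.5)
        return Qp, lamp

    # sanity check against the batch eigendecomposition
    rng = np.random.default_rng(0)
    X = rng.standard_normal((6, 3))
    K = np.exp(-((X[:, None] - X[None, :]) ** 2).sum(-1))   # RBF kernel
    lam, Q = np.linalg.eigh(K[:5, :5])
    Q, lam = kpca_append(Q, lam, K[:5, 5], K[5, 5])
    assert np.allclose(Q @ np.diag(lam) @ Q.T, K)

The final assertion confirms that the incrementally updated factors reproduce the full kernel matrix exactly (up to floating-point error).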

Similar resources

Nyström Approximations for Scalable Face Recognition: A Comparative Study

Kernel principal component analysis (KPCA) is a widely used statistical method for representation learning, where PCA is performed in reproducing kernel Hilbert space (RKHS) to extract nonlinear features from a set of training examples. Despite the success in various applications including face recognition, KPCA does not scale up well with the sample size, since, as in other kernel methods, it i...
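As a point of reference for the scalability issue raised here, below is a minimal batch KPCA sketch in NumPy (a generic textbook version, not tied to any of the cited papers): the eigendecomposition of the centred n x n kernel matrix is the O(n^3) step that motivates the approximation methods.

    import numpy as np

    def kernel_pca(K, n_components):
        """Batch kernel PCA on a precomputed n x n kernel matrix K.

        Double-centres K in feature space, eigendecomposes it (the
        O(n^3) step that limits scalability), and returns the
        projections of the n training points onto the top components.
        """
        n = K.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n    # centring matrix
        Kc = J @ K @ J                         # kernel matrix centred in RKHS
        lam, Q = np.linalg.eigh(Kc)            # eigenvalues in ascending order
        lam = lam[::-1][:n_components]
        Q = Q[:, ::-1][:, :n_components]
        return Q * np.sqrt(np.maximum(lam, 0.0))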

Empirical Evaluation of Kernel PCA Approximation Methods in Classification Tasks

Kernel Principal Component Analysis (KPCA) is a popular dimensionality reduction technique with a wide range of applications. However, it suffers from the problem of poor scalability. Various approximation methods have been proposed in the past to overcome this problem. The Nyström method, Randomized Nonlinear Component Analysis (RNCA) and Streaming Kernel Principal Component Analysis (SKPCA) w...

Less is More: Nyström Computational Regularization

We study Nyström type subsampling approaches to large scale kernel methods, and prove learning bounds in the statistical learning setting, where random sampling and high probability estimates are considered. In particular, we prove that these approaches can achieve optimal learning bounds, provided the subsampling level is suitably chosen. These results suggest a simple incremental variant of N...
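For orientation, the basic Nyström approximation these subsampling results build on can be sketched as follows. This is a generic version under my own naming: K_nm holds the kernel between all n points and m sampled landmarks, K_mm the landmark block.

    import numpy as np

    def nystrom(K_nm, K_mm, tol=1e-12):
        """Nystrom approximation K ~ K_nm @ pinv(K_mm) @ K_nm.T.

        Only the n x m and m x m kernel blocks are needed, so the cost
        is O(n m^2) in time and O(n m) in memory, versus O(n^2) just to
        store the full kernel matrix.
        """
        lam, U = np.linalg.eigh(K_mm)          # small m x m problem
        inv_lam = np.where(lam > tol * lam.max(), 1.0 / lam, 0.0)
        C = K_nm @ (U * inv_lam)               # K_nm @ U @ diag(inv_lam)
        return C @ U.T @ K_nm.T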

Dimension Reduction: A Guided Tour

We give a tutorial overview of several geometric methods for dimension reduction. We divide the methods into projective methods and methods that model the manifold on which the data lies. For projective methods, we review projection pursuit, principal component analysis (PCA), kernel PCA, probabilistic PCA, canonical correlation analysis, oriented PCA, and several techniques for sufficient dime...

Geometric Methods for Feature Extraction and Dimensional Reduction - A Guided Tour

We give a tutorial overview of several geometric methods for feature selection and dimensional reduction. We divide the methods into projective methods and methods that model the manifold on which the data lies. For projective methods, we review projection pursuit, principal component analysis (PCA), kernel PCA, probabilistic PCA, and oriented PCA; and for the manifold methods, we review multi...

Journal title:
  • CoRR

Volume: abs/1802.00043

Pages: -

Publication date: 2018